In mathematics, the limit of a function is a fundamental concept in calculus and analysis concerning the behavior of that function near a particular input.
Formal definitions, first devised in the early 19th century, are given below. Informally, a function f assigns an output f(x) to every input x. We say that the function has a limit L at an input p if f(x) gets closer and closer to L as x moves closer and closer to p. More specifically, when f is applied to any input sufficiently close to p, the output value is forced arbitrarily close to L. On the other hand, if some inputs very close to p are taken to outputs that stay a fixed distance apart, then we say the limit does not exist.
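This informal description is usually made precise with the standard (ε, δ)-formulation; a sketch of that definition, in the notation used here, is:

```latex
% (epsilon, delta)-definition of the limit of f at p:
% for every output tolerance epsilon > 0 there is an input
% tolerance delta > 0 such that keeping x within delta of p
% (with x != p) forces f(x) within epsilon of L.
\[
\lim_{x \to p} f(x) = L
\quad\iff\quad
\forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x :
\; 0 < |x - p| < \delta \implies |f(x) - L| < \varepsilon .
\]
```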
The notion of a limit has many applications in modern calculus. In particular, the many definitions of continuity employ the concept of limit: roughly, a function is continuous if all of its limits agree with the values of the function. The concept of limit also appears in the definition of the derivative: in the calculus of one variable, this is the limiting value of the slope of secant lines to the graph of a function.
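Both of these uses can be stated compactly. The sketch below gives the common textbook formulations: continuity of f at a point p, and the single-variable derivative as the limiting slope of secant lines through (x, f(x)) and (x + h, f(x + h)).

```latex
% Continuity at p: the limit of f at p exists and equals f(p).
\[
f \text{ is continuous at } p
\quad\iff\quad
\lim_{x \to p} f(x) = f(p).
\]

% Derivative at x: the secant through (x, f(x)) and (x+h, f(x+h))
% has slope (f(x+h) - f(x)) / h; the derivative is its limit as h -> 0.
\[
f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}.
\]
```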